This is a static quantized version of PRIME-RL/P1-30B-A3B, a 30-billion-parameter large language model trained with reinforcement learning and optimized for physics and competition-level reasoning; it supports English as well as multilingual text.
Natural Language Processing · Transformers · Multiple Languages
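Below is a minimal usage sketch with the Transformers library, assuming the quantized weights load through the standard AutoModel API. The exact repository ID of this quantized variant is not given above, so the base model ID `PRIME-RL/P1-30B-A3B` is used as a placeholder; generation parameters are illustrative only.

```python
# Minimal sketch: load the model and run a physics-style prompt, matching the
# model's competition-reasoning focus. The repo ID below is the base model and
# should be replaced with the ID of this quantized variant.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PRIME-RL/P1-30B-A3B"  # placeholder: substitute the quantized repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # spread the 30B parameters across available devices
)

messages = [
    {
        "role": "user",
        "content": "A ball is thrown straight up at 20 m/s. How long until it "
                   "returns to the thrower's hand? Take g = 9.8 m/s^2.",
    }
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```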